Entropy Rate for Hidden Markov Chains with Rare Transitions

Authors

  • Yuval Peres
  • Anthony Quas
Abstract

We consider hidden Markov chains obtained by passing a Markov chain with rare transitions through a noisy memoryless channel, and obtain asymptotic estimates for the entropy of the resulting hidden Markov chain as the transition rate is reduced to zero.

Let $(X_n)$ be a Markov chain with finite state space $S$ and transition matrix $P(p)$, and let $(Y_n)$ be the hidden Markov chain observed by passing $(X_n)$ through a homogeneous noisy memoryless channel: $Y$ takes values in a set $T$, and there exists a matrix $Q$ such that
$$\mathbb{P}\left(Y_n = j \,\middle|\, X_n = i,\ X_{-\infty}^{n-1},\ X_{n+1}^{\infty},\ Y_{-\infty}^{n-1},\ Y_{n+1}^{\infty}\right) = Q_{ij}.$$
We make the additional assumption on the channel that the rows of $Q$ are distinct; in this case we call the channel statistically distinguishing. Finally, we assume that $P(p)$ is of the form $I + pA$, where $A$ is a matrix with negative entries on the diagonal, non-negative off-diagonal entries, and zero row sums, and that for small positive $p$ the Markov chain with transition matrix $P(p)$ is irreducible. Notice that for Markov chains of this form, the invariant distribution $(\pi_i)_{i \in S}$ does not depend on $p$. In this case, we say that for small positive values of $p$ the Markov chain is in a rare-transition regime.

We adopt the convention that $H$ denotes the entropy of a finite partition, whereas $h$ denotes the entropy of a process (the entropy rate, in information-theory terminology). Given an irreducible Markov chain with transition matrix $P$, we let $h(P)$ be the entropy of the Markov chain, i.e.
$$h(P) = -\sum_{i,j} \pi_i P_{ij} \log P_{ij},$$
where $(\pi_i)$ is the (unique) invariant distribution of the chain and where, as usual, we adopt the convention $0 \log 0 = 0$. We also let $H_{\mathrm{chan}}(i)$ be the entropy of the output of the channel when the input symbol is $i$, i.e.
$$H_{\mathrm{chan}}(i) = -\sum_{j \in T} Q_{ij} \log Q_{ij}.$$
Finally, let $h(Y)$ denote the entropy rate of $Y$:
$$h(Y) = -\lim_{N \to \infty} \frac{1}{N} \sum_{w \in T^N} \mathbb{P}(Y_1^N = w) \log \mathbb{P}(Y_1^N = w).$$

Theorem 1. Consider the hidden Markov chain $(Y_n)$ obtained by observing a Markov chain with irreducible transition matrix $P(p) = I + pA$ through a statistically distinguishing channel with transition matrix $Q$. Then there exists a constant $C > 0$ such that for all small $p > 0$, …
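To make the definitions above concrete, here is a small numerical sketch of the quantities $h(P)$ and $H_{\mathrm{chan}}(i)$. The generator $A$ and channel $Q$ below are illustrative choices, not taken from the paper; note that the printed invariant distribution is the same for every $p$, as the abstract claims.

```python
import numpy as np

def stationary(P):
    """Invariant distribution: left eigenvector of P for eigenvalue 1."""
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
    return pi / pi.sum()

def entropy_rate(P):
    """h(P) = -sum_{i,j} pi_i P_ij log P_ij, with the 0 log 0 = 0 convention."""
    pi = stationary(P)
    terms = np.where(P > 0, P * np.log(np.where(P > 0, P, 1.0)), 0.0)
    return -np.sum(pi[:, None] * terms)

def channel_entropy(Q):
    """H_chan(i) = -sum_j Q_ij log Q_ij, one value per input symbol i."""
    t = np.where(Q > 0, Q * np.log(np.where(Q > 0, Q, 1.0)), 0.0)
    return -t.sum(axis=1)

# Illustrative generator A: negative diagonal, non-negative off-diagonal,
# zero row sums -- so P(p) = I + pA is a stochastic matrix for small p.
A = np.array([[-1.0, 1.0],
              [2.0, -2.0]])
# Illustrative statistically distinguishing channel (rows of Q are distinct).
Q = np.array([[0.9, 0.1],
              [0.2, 0.8]])

for p in (0.1, 0.01, 0.001):
    P = np.eye(2) + p * A
    print(f"p={p}: pi={stationary(P)}, h(P)={entropy_rate(P):.5f}")
print("Hchan =", channel_entropy(Q))
```

Running this shows $h(P(p)) \to 0$ as $p \to 0$, while the invariant distribution stays fixed at $(2/3, 1/3)$.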


Similar Articles

Taylor Expansion for the Entropy Rate of Hidden Markov Chains

We study the entropy rate of a hidden Markov process, defined by observing the output of a symmetric channel whose input is a first-order Markov process. Although this definition is very simple, computing the entropy rate exactly remains an open problem. We introduce some probability matrices based on the Markov chain's and the channel's parameters. Then, we try to obtain an estimate ...
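In practice, $h(Y)$ for such a process is estimated with the normalized forward recursion, which evaluates $-\frac{1}{N}\log \mathbb{P}(Y_1^N)$ exactly for a simulated observation sequence; by Shannon–McMillan–Breiman this converges to the entropy rate. A minimal sketch (the chain $P$ and crossover probability below are illustrative):

```python
import numpy as np

def forward_log_likelihood(P, Q, pi0, y):
    """Return -(1/N) log P(Y_1^N = y) via the normalized forward recursion."""
    alpha = pi0 * Q[:, y[0]]
    loglik = np.log(alpha.sum())
    alpha = alpha / alpha.sum()
    for obs in y[1:]:
        alpha = (alpha @ P) * Q[:, obs]     # predict, then weight by emission
        s = alpha.sum()
        loglik += np.log(s)
        alpha = alpha / s                   # renormalize to avoid underflow
    return -loglik / len(y)

# Illustrative symmetric input chain and binary symmetric channel.
P = np.array([[0.9, 0.1],
              [0.1, 0.9]])
eps = 0.2
Q = np.array([[1 - eps, eps],
              [eps, 1 - eps]])
pi0 = np.array([0.5, 0.5])

# Simulate the hidden chain and its noisy observation.
rng = np.random.default_rng(0)
x = [0]
for _ in range(9999):
    x.append(rng.choice(2, p=P[x[-1]]))
y = [xi if rng.random() > eps else 1 - xi for xi in x]

print(forward_log_likelihood(P, Q, pi0, y))  # estimate of h(Y), in nats
```

A sanity check: with crossover 1/2 the output is i.i.d. fair bits, and the recursion returns exactly $\log 2$ for any sequence.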


Relative Entropy Rate between a Markov Chain and Its Corresponding Hidden Markov Chain

In this paper we study the relative entropy rate between a homogeneous Markov chain and a hidden Markov chain defined by observing the output of a discrete stochastic channel whose input is the finite-state-space homogeneous stationary Markov chain. For this purpose, we obtain the relative entropy between two finite subsequences of the above-mentioned chains with the help of the definition of...


The Rate of Rényi Entropy for Irreducible Markov Chains

In this paper, we obtain the Rényi entropy rate for irreducible-aperiodic Markov chains with countable state space, using the theory of countable nonnegative matrices. We also obtain the bound for the rate of Rényi entropy of an irreducible Markov chain. Finally, we show that the bound for the Rényi entropy rate is the Shannon entropy rate.
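In the finite-state case, the Rényi entropy rate of order $\alpha \neq 1$ of an irreducible chain has the standard spectral-radius form $\log \rho(P^{(\alpha)})/(1-\alpha)$, where $P^{(\alpha)}$ has entries $P_{ij}^{\alpha}$ and $\rho$ is the Perron root. A sketch under that formula (the chain is illustrative, and the countable-state subtleties treated in the paper are ignored):

```python
import numpy as np

def renyi_entropy_rate(P, alpha):
    """Renyi entropy rate of order alpha != 1 for an irreducible finite chain:
    log(spectral radius of the matrix with entries P_ij ** alpha) / (1 - alpha)."""
    rho = np.max(np.abs(np.linalg.eigvals(P ** alpha)))
    return np.log(rho) / (1.0 - alpha)

P = np.array([[0.9, 0.1],     # illustrative irreducible chain
              [0.1, 0.9]])
for a in (0.5, 0.99, 1.01, 2.0):
    print(a, renyi_entropy_rate(P, a))
```

As $\alpha \to 1$ the values approach the Shannon entropy rate, consistent with the abstract's closing claim that the bound for the Rényi rate is the Shannon rate.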


Asymptotic Filtering and Entropy Rate of a Hidden Markov Process in the Rare Transitions Regime

Chandra Nair (Dept. of Elect. Engg., Stanford University, Stanford, CA 94305, USA; mchandra@stanford.edu), Erik Ordentlich (Information Theory Research Group, HP Laboratories, Palo Alto, CA 94304, USA; erik.ordentlich@hp.com), Tsachy Weissman (Dept. of Elect. Engg., Stanford University, Stanford, CA 94305, USA; tsachy@stanford.edu). Abstract—Recent work by Ordentlich an...


Estimation of the Entropy Rate of Ergodic Markov Chains

In this paper an approximation of the entropy rate of an ergodic Markov chain is calculated via sample-path simulation. Although an explicit form of the entropy rate exists, the exact computational method is laborious to apply. It is demonstrated that the estimated entropy rate of the Markov chain via a sample path not only converges to the correct entropy rate but also does so exponential...
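The abstract does not spell out its estimator, but one natural sample-path version is the ergodic average of $-\log P[x_t, x_{t+1}]$ along a simulated trajectory, which converges to $h(P)$ by the ergodic theorem. An illustrative sketch:

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_path(P, n, x0=0):
    """Simulate n steps of the chain with transition matrix P."""
    xs = np.empty(n, dtype=int)
    xs[0] = x0
    for t in range(1, n):
        xs[t] = rng.choice(len(P), p=P[xs[t - 1]])
    return xs

def entropy_rate_estimate(P, n=50_000):
    """Ergodic average of -log P[x_t, x_{t+1}] along one sample path;
    by the ergodic theorem this converges to h(P) as n grows."""
    xs = sample_path(P, n)
    return -np.mean(np.log(P[xs[:-1], xs[1:]]))

P = np.array([[0.9, 0.1],
              [0.1, 0.9]])
true_h = -(0.9 * np.log(0.9) + 0.1 * np.log(0.1))
print(entropy_rate_estimate(P), "vs exact", true_h)
```

For this two-state chain the exact rate is available in closed form, so the Monte Carlo error is easy to check directly.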



Journal:
  • CoRR

Volume abs/1012.2086  Issue: —

Pages: —

Publication date: 2009